Exponential ReLU Neural Network Approximation Rates for Point and Edge Singularities


Abstract

In certain polytopal domains $\varOmega$, in space dimension $d=2,3$, we prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in $H^1(\varOmega)$ for weighted analytic function classes. These classes comprise in particular solution sets of source and eigenvalue problems for elliptic PDEs with analytic data. Functions in these classes are locally analytic on open subdomains $D\subset \varOmega$, but may exhibit isolated point singularities in the interior of $\varOmega$, or corner and edge singularities at the boundary $\partial\varOmega$. The approximation rates are shown to hold for $d=2$ on Lipschitz polygons with straight sides, and for $d=3$ on Fichera-type polyhedral domains with plane faces. The constructive proofs indicate that NN depth and size increase poly-logarithmically with respect to the target accuracy $\varepsilon>0$. The results cover solution sets of linear, second-order elliptic PDEs with analytic data, and of nonlinear elliptic eigenvalue problems with analytic nonlinearities and singular, weighted analytic potentials, as they arise in electron structure models. Here, the functions correspond to electron densities, which may exhibit isolated point singularities at the positions of the nuclei.


Similar Articles

Quad-pixel edge detection using neural network

One of the most fundamental features of a digital image, and a basic step in image processing, analysis, pattern recognition, and computer vision, is the edge of an image; the precision and reliability of edge-detection results directly affect a machine system's comprehension of the objective world. Several edge detectors have been developed in the past decades, although no single edge detector...

Full text


Bounds on rates of variable-basis and neural-network approximation

Tightness of bounds on rates of approximation by feedforward neural networks is investigated in a more general context of nonlinear approximation by variable-basis functions. Tight bounds on the worst case error in approximation by linear combinations of elements of an orthonormal variable basis are derived.

Full text

Tight frame approximation for multi-frames and super-frames

In this thesis, a generator for multi-frames or super-frames generated under the action of a projective unitary representation of countable discrete groups is studied. Examples of such frames include Gabor multi-frames, Gabor super-frames, and frames for shift-invariant subspaces. We show that there exists a unique normalized tight multi-frame (super-frame) generator having minimal distance from it. Similar problems for dual frames are also considered, and some ...

First 15 pages

Improving performance of recurrent neural network with relu nonlinearity

In recent years, significant progress has been made in successfully training recurrent neural networks (RNNs) on sequence learning problems involving long-range temporal dependencies. The progress has been made on three fronts: (a) algorithmic improvements involving sophisticated optimization techniques, (b) network design involving complex hidden layer nodes and specialized recurrent layer conn...

Full text


Journal

Journal title: Foundations of Computational Mathematics

Year: 2022

ISSN: 1615-3383, 1615-3375

DOI: https://doi.org/10.1007/s10208-022-09565-9